16 research outputs found

    Software architecture for modeling and distributing virtual environments

    Get PDF

    Fidelity metrics for virtual environment simulations based on spatial memory awareness states

    Get PDF
    This paper describes a methodology based on human judgments of memory awareness states for assessing the simulation fidelity of a virtual environment (VE) relative to its real-world counterpart. To demonstrate the distinction between task-performance-based approaches and additional human evaluation of cognitive awareness states, a photorealistic VE was created. The resulting scenes, displayed on a head-mounted display (HMD) with or without head tracking and on a desktop monitor, were compared with the real-world task situation they represented, investigating spatial memory after exposure. Participants characterized their spatial recollections by selecting one of four awareness states after retrieval, in an initial test and in a retention test one week after exposure to the environment. These states reflected the level of visual mental imagery involved during retrieval and the familiarity of the recollection, and also included guesses, even if informed. Experimental results revealed variations in the distribution of participants' awareness states across conditions while, in certain cases, task performance failed to reveal any. Experimental conditions that incorporated head tracking were not associated with visually induced recollections. In general, simulation of task performance does not necessarily entail simulation of the awareness states involved when completing a memory task. The premise of this research is a focus on how tasks are achieved, rather than only on what is achieved. The extent to which judgments of human memory recall, memory awareness states, and presence in the physical and virtual environments are similar provides a fidelity metric for the simulation in question.

    Update rates and fidelity in virtual environments

    No full text
    Interaction is the primary characteristic of a Virtual Environment, and the update rate is normally taken as a measure of the interactivity of the system. The speed of many systems is dictated by the slowest component, which is often the Computer Image Generator (CIG). The workload of the CIG commonly varies, and hence so does the performance of the system. This paper shows how a variable update rate can produce undesirable results. Two solutions to this problem are presented: service degradation and worst-case. In the case of the CIG, service degradation would require the quality of the image to be reduced so that the time taken never exceeds a given deadline. The worst-case technique works by finding the longest time taken to render any view and then using that as the deadline for completion. Support for predictive methods is one of several benefits of this approach. An implementation of the worst-case technique is described which takes finer control over the CIG than usual and may be applied to many existing systems with little modification.
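    The worst-case technique summarized above can be sketched briefly: render every view once, record the longest render time, and then hold every subsequent frame to that fixed deadline, so the update rate never fluctuates. A minimal Python sketch under those assumptions (the `render` callable and function names are illustrative, not taken from the paper):

    ```python
    import time

    def measure_worst_case(render, views):
        """Render every view once and return the longest time any view took."""
        worst = 0.0
        for view in views:
            start = time.perf_counter()
            render(view)
            worst = max(worst, time.perf_counter() - start)
        return worst

    def run_fixed_rate(render, views, deadline):
        """Render each view, then idle out the remainder of the fixed deadline,
        so every frame is presented at (approximately) the same interval."""
        frame_times = []
        for view in views:
            start = time.perf_counter()
            render(view)
            elapsed = time.perf_counter() - start
            if elapsed < deadline:
                time.sleep(deadline - elapsed)  # pad cheap frames up to the deadline
            frame_times.append(time.perf_counter() - start)
        return frame_times
    ```

    The cost of this approach is that fast views are padded up to the worst-case time; the benefit, as the paper notes, is a constant and therefore predictable update interval.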

    Rycharde Hawkes

    No full text
    This report is divided into three separate stages. STAGE 1 presents the initial investigations based on the generic goals stated above. STAGE 2 presents the revised directions and first experimental results, and STAGE 3 describes the final, formally designed experimental studies which formed the core of this project. STAGES 2 and 3 resulted in Katerina Mania's Ph.D. thesis, submitted in June 2001. Her Ph.D. degree was awarded in October 200

    Jumbo Store: Providing Efficient Incremental Upload and Versioning for a Utility Rendering Service

    No full text
    We have developed a new storage system called the Jumbo Store (JS), based on encoding directory-tree snapshots as graphs called HDAGs, whose nodes are small variable-length chunks of data and whose edges are hash pointers. We store or transmit each node only once, encoding with landmark-based chunking plus some new tricks. This leads to very efficient incremental upload and storage of successive snapshots: we report compression factors over 16x for real data, and a comparison shows that our incremental upload sends only 1/5 as much data as Rsync. To demonstrate the utility of the Jumbo Store, we have integrated it into HP Labs' prototype Utility Rendering Service (URS), which accepts rendering data in the form of directory-tree snapshots from small teams of animators, renders one or more requested frames using a processor farm, and then makes the rendered frames available for download. Efficient incremental upload is crucial to the URS's usability and responsiveness because of the teams' slow Internet connections. We report on the JS's performance during a major field test of the URS, in which the URS was offered to 11 groups of animators for 10 months during an animation showcase to create high-quality short animations.
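    The hash-pointer deduplication at the heart of this design can be illustrated in a few lines: split the data into content-defined chunks, key each chunk by its cryptographic hash, and store or transmit each unique chunk only once, so a second snapshot that shares most of its content with the first sends only the new chunks. A simplified Python sketch (the rolling-sum chunker below is a crude stand-in for the paper's landmark-based chunking, and all names are illustrative, not the JS API):

    ```python
    import hashlib

    LANDMARK_MOD = 64  # illustrative; real systems use KB-scale average chunks
    MIN_CHUNK = 8      # avoid degenerate tiny chunks

    def chunk(data: bytes):
        """Content-defined chunking: cut wherever a rolling byte sum hits a
        'landmark' value, so boundaries depend on content, not on offsets."""
        chunks, start, acc = [], 0, 0
        for i, b in enumerate(data):
            acc = (acc + b) % LANDMARK_MOD
            if acc == 0 and i + 1 - start >= MIN_CHUNK:
                chunks.append(data[start:i + 1])
                start = i + 1
        if start < len(data):
            chunks.append(data[start:])
        return chunks

    class ChunkStore:
        """Stores each unique chunk once, keyed by its hash (a 'hash pointer')."""
        def __init__(self):
            self.store = {}

        def put(self, data: bytes):
            """Store a blob; return its list of chunk hashes and the number of
            bytes that actually had to be sent (i.e., chunks not seen before)."""
            ptrs, sent = [], 0
            for c in chunk(data):
                h = hashlib.sha256(c).hexdigest()
                if h not in self.store:
                    self.store[h] = c
                    sent += len(c)
                ptrs.append(h)
            return ptrs, sent

        def get(self, ptrs):
            """Reassemble a blob from its list of hash pointers."""
            return b"".join(self.store[h] for h in ptrs)
    ```

    Because chunk boundaries are derived from the content itself, an appended or locally edited file shares most of its chunk hashes with the previous snapshot, which is what makes the incremental upload cheap.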

    Assessing functional realism

    No full text
    This research presents an innovative method for assessing the 'functional realism' of interactive Virtual Environments (VEs), in which the same information is transmitted in real and synthetic scenes. The basic premise is that an individual's prior experience influences how one perceives, comprehends, and remembers information in a scene. 120 participants, across two conditions of varying rendering quality of a space and with varied ratios of objects' association to the scene context, were exposed to the VE and completed an object-based memory recognition task. The results of this study could have significant implications for identifying areas of an interactive computer graphics scene that require varying quality of rendering.